Mutual Dimension and Random Sequences

Authors

  • Adam Case
  • Jack H. Lutz
Abstract

If S and T are infinite sequences over a finite alphabet, then the lower and upper mutual dimensions mdim(S : T) and Mdim(S : T) are the lower and upper densities of the algorithmic information that is shared by S and T. In this paper we investigate the relationships between mutual dimension and coupled randomness, which is the algorithmic randomness of two sequences R1 and R2 with respect to probability measures that may be dependent on one another. For a restricted but interesting class of coupled probability measures we prove an explicit formula for the mutual dimensions mdim(R1 : R2) and Mdim(R1 : R2), and we show that the condition Mdim(R1 : R2) = 0 is necessary but not sufficient for R1 and R2 to be independently random. We also identify conditions under which Billingsley generalizations of the mutual dimensions mdim(S : T) and Mdim(S : T) can be meaningfully defined; we show that under these conditions these generalized mutual dimensions have the “correct” relationships with the Billingsley generalizations of dim(S), Dim(S), dim(T), and Dim(T) that were developed and applied by Lutz and Mayordomo; and we prove a divergence formula for the values of these generalized mutual dimensions.
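As a rough illustration (not the paper's construction), the density of shared algorithmic information in two prefixes can be approximated by substituting a standard compressor for Kolmogorov complexity, via the identity I(x : y) ≈ C(x) + C(y) − C(x, y). The function names below are ours, and compression is only a crude upper-bound proxy:

```python
import random
import zlib

def c(b: bytes) -> int:
    """Compressed length in bits: a crude, computable stand-in for
    Kolmogorov complexity (always an overestimate)."""
    return 8 * len(zlib.compress(b, 9))

def mutual_info_density(s: str, t: str, n: int) -> float:
    """Approximate density of algorithmic information shared by the
    length-n prefixes of s and t, using C(x) + C(y) - C(xy)."""
    x, y = s[:n].encode(), t[:n].encode()
    shared = c(x) + c(y) - c(x + y)
    return max(shared, 0) / n

rng = random.Random(0)
r1 = "".join(rng.choice("01") for _ in range(4000))
r2 = "".join(rng.choice("01") for _ in range(4000))

# A sequence shares essentially all of its information with itself,
# and essentially none with an independently generated sequence.
print(mutual_info_density(r1, r1, 4000) > mutual_info_density(r1, r2, 4000))
```

In the paper's setting these densities are taken along all prefixes of the infinite sequences, with liminf giving mdim and limsup giving Mdim; the compressor here merely makes the "shared information per symbol" intuition concrete.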


Similar articles

Measuring Statistical Dependence via the Mutual Information Dimension

We propose to measure statistical dependence between two random variables by the mutual information dimension (MID), and present a scalable parameter-free estimation method for this task. Supported by sound dimension theory, our method gives an effective solution to the problem of detecting interesting relationships of variables in massive data, which is nowadays a heavily studied topic in many...
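The quantity MID referred to above decomposes as MID(X; Y) = dim(X) + dim(Y) − dim(X, Y), where dim denotes information dimension. A hedged, simplified sketch (not the authors' estimator): estimate each information dimension as the slope of the discretized entropy H_k against the resolution level k, with 2^k bins per coordinate. All names here are ours:

```python
import math
import random
from collections import Counter

def entropy_bits(cells):
    """Shannon entropy (bits) of the empirical distribution over cells."""
    n = len(cells)
    return -sum(c / n * math.log2(c / n) for c in Counter(cells).values())

def info_dim(points, levels=range(3, 8)):
    """Crude information-dimension estimate: least-squares slope of H_k
    vs. k, where H_k discretizes each coordinate of [0,1)^d into 2**k bins."""
    ks = list(levels)
    hs = [entropy_bits([tuple(int(x * (1 << k)) for x in p) for p in points])
          for k in ks]
    kbar, hbar = sum(ks) / len(ks), sum(hs) / len(hs)
    return (sum((k - kbar) * (h - hbar) for k, h in zip(ks, hs))
            / sum((k - kbar) ** 2 for k in ks))

def mid(xs, ys):
    """MID(X; Y) = dim(X) + dim(Y) - dim(X, Y)."""
    return (info_dim([(x,) for x in xs]) + info_dim([(y,) for y in ys])
            - info_dim(list(zip(xs, ys))))

rng = random.Random(1)
xs = [rng.random() for _ in range(20000)]
ys = [rng.random() for _ in range(20000)]  # independent of xs

# Dependence (ys == xs) yields a larger MID than independence.
print(mid(xs, xs) > mid(xs, ys))
```

This finite-sample slope estimator is biased at fine resolutions; the paper's contribution is precisely a scalable, parameter-free estimator that avoids such crude choices.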


Clustering of a Number of Genes Affecting in Milk Production using Information Theory and Mutual Information

Information theory is a branch of mathematics. Information theory is used in genetic and bioinformatics analyses and can be used for many analyses related to the biological structures and sequences. Bio-computational grouping of genes facilitates genetic analysis, sequencing and structural-based analyses. In this study, after retrieving gene and exon DNA sequences affecting milk yield in dairy ...


Repeat Sequences and Base Correlations in Human Y Chromosome Palindromes

On the basis of information theory and statistical methods, we use mutual information, ntuple entropy and conditional entropy, combined with biological characteristics, to analyze the long range correlation and short range correlation in human Y chromosome palindromes. The magnitude distribution of the long range correlation which can be reflected by the mutual information is P5>P5a>P5b (P5a an...


A Kolmogorov complexity characterization of constructive Hausdorff dimension

Lutz [7] has recently developed a constructive version of Hausdorff dimension, using it to assign to every sequence A ∈ C a constructive dimension dim(A) ∈ [0,1]. Classical Hausdorff dimension [3] is an augmentation of Lebesgue measure, and in the same way constructive dimension augments Martin– Löf randomness. All Martin–Löf random sequences have constructive dimension 1, while in the case of ...
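The characterization referred to in the title, proved by Mayordomo, identifies constructive dimension with the lower density of prefix-free Kolmogorov complexity along the sequence's prefixes:

\[
\dim(A) \;=\; \liminf_{n \to \infty} \frac{K(A \upharpoonright n)}{n},
\]

where \(K\) denotes prefix-free Kolmogorov complexity and \(A \upharpoonright n\) is the length-\(n\) prefix of \(A\). This makes precise the sense in which dimension measures the asymptotic density of algorithmic information, paralleling the mutual-dimension definitions in the main abstract above.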


Lowness for Effective Hausdorff Dimension

We examine the sequences A that are low for dimension, i.e., those for which the effective (Hausdorff) dimension relative to A is the same as the unrelativized effective dimension. Lowness for dimension is a weakening of lowness for randomness, a central notion in effective randomness. By considering analogues of characterizations of lowness for randomness, we show that lowness for dimension ca...



Journal:

Volume   Issue

Pages  -

Publication date: 2015